A second-order method for strongly convex ℓ1-regularization problems

Authors

  • Kimon Fountoulakis
  • Jacek Gondzio
Abstract

In this paper a second-order method for solving large-scale strongly convex ℓ1-regularized problems is developed. The proposed method is a Newton-CG (Conjugate Gradients) algorithm with backtracking line search embedded in a doubly-continuation scheme. Worst-case iteration complexity of the proposed Newton-CG is established. Based on the analysis of Newton-CG, worst-case iteration complexity of the doubly-continuation scheme is obtained. Numerical results are presented on large-scale problems for the doubly-continuation Newton-CG algorithm, which show that the proposed second-order method competes favourably with state-of-the-art first-order methods. In addition, ℓ1-regularized Sparse Least-Squares problems are discussed for which a parallel block coordinate descent method stagnates.
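As a rough illustration of the ingredients the abstract names (a smoothed ℓ1 term, Newton steps, a backtracking line search, and an outer continuation loop), here is a minimal NumPy sketch. The pseudo-Huber smoothing, the parameter names, and the dense direct linear solve (standing in for CG at this toy scale) are assumptions made for illustration, not the authors' actual algorithm.

```python
import numpy as np

def smoothed_l1_newton(A, b, tau, mu, x0, tol=1e-8, max_newton=50):
    """Damped Newton method for the smoothed problem
    min_x  tau * sum_i (sqrt(mu^2 + x_i^2) - mu) + 0.5 * ||A x - b||^2,
    where the pseudo-Huber term is a smooth surrogate for tau * ||x||_1."""
    x = x0.copy()
    for _ in range(max_newton):
        r = A @ x - b
        s = np.sqrt(mu**2 + x**2)
        g = tau * (x / s) + A.T @ r                  # gradient
        if np.linalg.norm(g) < tol:
            break
        H = np.diag(tau * mu**2 / s**3) + A.T @ A    # Hessian (positive definite)
        p = np.linalg.solve(H, -g)                   # at scale this solve would be CG
        # backtracking (Armijo) line search
        f = tau * np.sum(s - mu) + 0.5 * (r @ r)
        alpha = 1.0
        while alpha > 1e-12:
            xn = x + alpha * p
            rn = A @ xn - b
            fn = tau * np.sum(np.sqrt(mu**2 + xn**2) - mu) + 0.5 * (rn @ rn)
            if fn <= f + 1e-4 * alpha * (g @ p):
                break
            alpha *= 0.5
        x = x + alpha * p
    return x

def continuation_solve(A, b, tau, mu_final=1e-6, n_stages=6):
    """Outer continuation: solve a sequence of smoothed problems,
    tightening the smoothing parameter mu and warm-starting each stage."""
    x = np.zeros(A.shape[1])
    for mu in np.geomspace(1.0, mu_final, n_stages):
        x = smoothed_l1_newton(A, b, tau, mu, x)
    return x
```

Warm-starting each stage from the previous solution is what makes continuation pay off: the tighter smoothed problems are increasingly ill-conditioned, but the starting point is already close to their minimizer.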


Similar resources

A Novel Mechanical Fault Diagnosis Scheme Based on the Convex 1-D Second-Order Total Variation Denoising Algorithm

Convex 1-D first-order total variation (TV) denoising is an effective method for eliminating signal noise, which can be defined as a convex optimization problem consisting of a quadratic data-fidelity term and a non-convex regularization term. This formulation not only ensures strict convexity of the optimization problem, but also improves the sparseness of the total variation term by introducing the non-convex penalty fun...
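As a toy illustration of the quadratic-fidelity-plus-TV model described above (not the paper's algorithm), one can run plain gradient descent on a smoothed 1-D TV objective; the smoothing constant `eps`, the step-size rule, and the iteration count are all assumptions made for this sketch.

```python
import numpy as np

def tv_denoise_1d(y, lam, eps=1e-3, n_iter=5000):
    """Gradient descent on the smoothed 1-D TV objective
    0.5 * ||x - y||^2 + lam * sum_i sqrt(eps^2 + (x[i+1] - x[i])^2),
    where the sqrt term is a smooth stand-in for the absolute difference."""
    # conservative step size: 1 / (bound on the gradient's Lipschitz constant)
    step = 1.0 / (1.0 + 4.0 * lam / eps)
    x = y.astype(float).copy()
    for _ in range(n_iter):
        d = np.diff(x)
        w = d / np.sqrt(eps**2 + d**2)   # derivative of the smoothed |.|
        g = x - y
        g[:-1] -= lam * w                # chain rule through x[i+1] - x[i]
        g[1:] += lam * w
        x -= step * g
    return x
```

With this step size the iteration decreases the objective monotonically, so the total variation of the output is reduced relative to the noisy input; specialized TV solvers would of course converge far faster.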


Regularization Paths for Generalized Linear Models via Coordinate Descent.

We develop fast algorithms for estimation of generalized linear models with convex penalties. The models include linear regression, two-class logistic regression, and multinomial regression problems, while the penalties include ℓ1 (the lasso), ℓ2 (ridge regression) and mixtures of the two (the elastic net). The algorithms use cyclical coordinate descent, computed along a regularization path...
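For the plain lasso, the cyclical coordinate-descent update this abstract refers to reduces to a soft-thresholding step per coordinate. A minimal sketch under that assumption (not glmnet's actual implementation, which adds standardization, warm starts along the path, and active-set screening):

```python
import numpy as np

def soft_threshold(z, g):
    """Solution of min_t 0.5*(t - z)^2 + g*|t|."""
    return np.sign(z) * max(abs(z) - g, 0.0)

def lasso_cd(A, b, lam, n_sweeps=200):
    """Cyclical coordinate descent for
    min_x  (1 / (2m)) * ||A x - b||^2 + lam * ||x||_1."""
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0) / m     # per-coordinate curvature
    r = b.astype(float).copy()            # running residual b - A x
    for _ in range(n_sweeps):
        for j in range(n):
            # exact minimizer in coordinate j with the others held fixed
            rho = (A[:, j] @ r) / m + col_sq[j] * x[j]
            x_new = soft_threshold(rho, lam) / col_sq[j]
            r += A[:, j] * (x[j] - x_new)
            x[j] = x_new
    return x
```

Maintaining the residual `r` incrementally is the key trick: each coordinate update costs O(m) instead of recomputing `A @ x` from scratch.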


Cubic regularization of Newton’s method for convex problems with constraints

In this paper we derive efficiency estimates of the regularized Newton's method as applied to constrained convex minimization problems and to variational inequalities. We study a one-step Newton's method and its multistep accelerated version, which converges on smooth convex problems as O(1/k³), where k is the iteration counter. We also derive the efficiency estimate of a second-order scheme ...


Optimal Newton-type methods for nonconvex smooth optimization problems

We consider a general class of second-order iterations for unconstrained optimization that includes regularization and trust-region variants of Newton's method. For each method in this class, we exhibit a smooth, bounded-below objective function, whose gradient is globally Lipschitz continuous within an open convex set containing any iterates encountered and whose Hessian is α-Hölder continuous...


On Second-order Properties of the Moreau-Yosida Regularization for Constrained Nonsmooth Convex Programs

In this paper, we investigate a class of constrained nonsmooth convex optimization problems, namely piecewise C² convex objectives with smooth convex inequality constraints. By using the Moreau-Yosida regularization, we convert these problems into unconstrained smooth convex programs. Then, we investigate the second-order properties of the Moreau-Yosida regularization η. By introdu...



Journal:
  • Math. Program.

Volume 156, Issue -

Pages -

Published 2016